- General metamodel engineering approach
- Optimization basics & engineering context
- Inversion basics & engineering context
- Improved numerical engineering algorithms & models
Y. Richet (yann.richet@irsn.fr)
credits: N. Durrande, R. Le Riche, V. Picheny, N. Garland
2018
The Unreasonable Effectiveness of Mathematics, E. Wigner
"I am an old man now, and when I die and go to heaven there are two matters on which I hope for enlightenment.
One is quantum electrodynamics, and the other is the turbulent motion of fluids.
And about the former I am rather optimistic.", H. Lamb
Industrial products:
Regulatory control:
For the sake of simplicity, and by convention, we consider minimization.
Gradient / Newton methods
Evolutionary algorithms (CMA-ES, PSO, …)
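For context, the simplest of the gradient methods listed above can be sketched in a few lines. The toy quadratic objective, step size, and iteration count below are illustrative choices, not from the slides:

```python
# Minimal gradient-descent sketch on a toy quadratic f(x) = (x - 2)^2,
# whose gradient is 2 * (x - 2). Learning rate and iteration count are
# illustrative; real engineering objectives rarely expose a gradient.

def grad_descent(grad, x0, lr=0.1, n_iter=100):
    """Plain gradient descent: x <- x - lr * grad(x)."""
    x = x0
    for _ in range(n_iter):
        x = x - lr * grad(x)
    return x

x_min = grad_descent(lambda x: 2.0 * (x - 2.0), x0=10.0)
print(round(x_min, 4))  # → 2.0, the true minimizer
```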
Optimization in engineering context raises many issues:
But basic methods are not suitable:
So, true practice is often very rough:
One-at-a-time optimization
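One-at-a-time optimization can be sketched as follows: each variable is optimized in turn over a local grid while the others stay frozen. All names and settings here are illustrative, not from the slides:

```python
# One-at-a-time (coordinate-wise) search: a rough but common engineering
# practice. Each variable is tuned on a local grid while the others are held
# fixed, then we sweep over the variables a few times.

def one_at_a_time(f, x0, half_width=2.0, n_grid=41, n_sweeps=3):
    x = list(x0)
    step = 2.0 * half_width / (n_grid - 1)
    for _ in range(n_sweeps):
        for i in range(len(x)):
            # Grid of candidate values for variable i, others frozen.
            candidates = [x[i] - half_width + k * step for k in range(n_grid)]
            x[i] = min(candidates, key=lambda c: f(x[:i] + [c] + x[i + 1:]))
    return x

# Toy separable objective with minimum at (1, -2).
best = one_at_a_time(lambda v: (v[0] - 1.0) ** 2 + (v[1] + 2.0) ** 2, [0.0, 0.0])
print(best)  # close to the optimum (1, -2)
```

Note that this ignores interactions between variables, which is exactly why it can fail on non-separable objectives.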
Bayesian optimization (EGO,…)
To limit the number of evaluations:
But it is risky to take a decision based only on a single model …
=> Diversify the model: [EXPLORE]
… However, we would expect the true minimum to be not so far from the model's one.
=> Take the model minimum just as a clue: [EXPLOIT]
To trade off between exploration and exploitation, we will consider:
Let's define the Probability of Improvement:
PI(x) = P( Y(x) ≤ y* ) = Φ( (y* − m(x)) / s(x) ), with y* = min(y_1, …, y_n),
which is analytical thanks to the Gaussian properties of the kriging model …
But the point where PI is highest is often close to the current best point …
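Assuming a Gaussian (kriging) posterior with mean m(x) and standard deviation s(x), PI is a one-line computation. A stdlib-only sketch (function names are illustrative):

```python
# Probability of Improvement: PI(x) = Phi((y_best - m(x)) / s(x)),
# with m, s the kriging posterior mean and standard deviation at x.
from math import erf, sqrt

def norm_cdf(z):
    """Standard normal CDF via erf (standard library only)."""
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

def probability_of_improvement(m, s, y_best):
    if s <= 0.0:  # at an observed point the posterior is deterministic
        return float(m <= y_best)
    return norm_cdf((y_best - m) / s)

print(probability_of_improvement(m=0.0, s=1.0, y_best=0.0))  # → 0.5
```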
Let's define the Expected Improvement:
EI(x) = E[ max(0, y* − Y(x)) ] = (y* − m(x)) Φ(u) + s(x) φ(u), with u = (y* − m(x)) / s(x),
which is (also) analytical thanks to the Gaussian properties of the kriging model …
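The closed form above translates directly into code. Again a stdlib-only sketch under the Gaussian-posterior assumption:

```python
# Expected Improvement: EI(x) = (y_best - m) * Phi(u) + s * phi(u),
# with u = (y_best - m) / s, for a Gaussian posterior N(m, s^2) at x.
from math import erf, exp, pi, sqrt

def norm_cdf(z):
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

def norm_pdf(z):
    return exp(-0.5 * z * z) / sqrt(2.0 * pi)

def expected_improvement(m, s, y_best):
    if s <= 0.0:  # deterministic posterior: improvement is known exactly
        return max(0.0, y_best - m)
    u = (y_best - m) / s
    return (y_best - m) * norm_cdf(u) + s * norm_pdf(u)

print(expected_improvement(m=0.0, s=1.0, y_best=0.0))  # → 1/sqrt(2*pi) ≈ 0.3989
```

Unlike PI, EI rewards both a low predicted mean and a large predictive uncertainty, which is what drives the explore/exploit trade-off.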
"Efficient Global Optimization of Expensive Black-Box Functions"- Jones, Schonlau, Welch,
(Journal of Global Optimization, December 1998)
EGO:
Maximize EI (*), compute f there, add the new point to the design of experiments.
Repeat until …
(*) using a standard optimization algorithm: BFGS, PSO, DiRect, …
Support for parallel evaluations of the objective function.
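The EGO loop can be sketched end-to-end on a 1D toy problem. Everything below is illustrative, not from the slides: a zero-mean simple-kriging model with a fixed Gaussian kernel and jitter, and a naive grid search standing in for the BFGS/PSO/DiRect inner optimizer:

```python
# A minimal, self-contained EGO loop sketch (all modeling choices illustrative).
from math import erf, exp, pi, sqrt

def norm_cdf(z): return 0.5 * (1.0 + erf(z / sqrt(2.0)))
def norm_pdf(z): return exp(-0.5 * z * z) / sqrt(2.0 * pi)

def solve(A, b):
    """Gaussian elimination with partial pivoting (small systems only)."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def kernel(a, b, ell=0.3):
    """Gaussian covariance, unit variance, fixed lengthscale (illustrative)."""
    return exp(-0.5 * ((a - b) / ell) ** 2)

def gp_posterior(X, Y, x):
    """Simple-kriging mean and std at x (zero prior mean, 1e-8 jitter)."""
    K = [[kernel(xi, xj) + (1e-8 if i == j else 0.0)
          for j, xj in enumerate(X)] for i, xi in enumerate(X)]
    k = [kernel(xi, x) for xi in X]
    alpha = solve(K, Y)                                  # K^{-1} Y
    w = solve(K, k)                                      # K^{-1} k
    m = sum(ki * ai for ki, ai in zip(k, alpha))
    var = max(0.0, 1.0 - sum(wi * ki for wi, ki in zip(w, k)))
    return m, sqrt(var)

def expected_improvement(m, s, y_best):
    if s <= 0.0:
        return max(0.0, y_best - m)
    u = (y_best - m) / s
    return (y_best - m) * norm_cdf(u) + s * norm_pdf(u)

def ego(f, X, n_iter=10):
    grid = [i / 100.0 for i in range(101)]               # naive inner "optimizer"
    Y = [f(x) for x in X]
    for _ in range(n_iter):
        y_best = min(Y)
        x_new = max(grid, key=lambda x:
                    expected_improvement(*gp_posterior(X, Y, x), y_best))
        X.append(x_new)
        Y.append(f(x_new))                               # expensive evaluation
    return X[Y.index(min(Y))]

x_star = ego(lambda x: (x - 0.7) ** 2, X=[0.0, 0.5, 1.0])
print(x_star)  # best observed point, near the true minimizer 0.7
```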
EGO++:
Let's define the Expected Excursion:
which is (also) analytical thanks to the Gaussian properties of the kriging model …
"Sequential design of computer experiments for the estimation of a probability of failure" - Bect, Ginsbourger, Li, Picheny, Vazquez, (Statistics and Computing 2012)
EGI:
Maximize the criterion (*), compute f there, add the new point to the design of experiments.
Repeat until …
(*) using a standard optimization algorithm: BFGS, PSO, DiRect, …
Exploration >> exploitation, which is interesting for identification problems, where a single discrete solution never applies (like inversion, whose target is a whole set).
Let's define the Probability of Excursion beyond a threshold T:
π(x) = P( Y(x) > T ) = Φ( (m(x) − T) / s(x) )
… and the Uncertainty of Excursion, e.g. π(x) (1 − π(x)):
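Assuming a Gaussian kriging posterior with mean m(x) and standard deviation s(x), the probability of excursion beyond a threshold T, and the pointwise uncertainty π(x)(1 − π(x)) used here as one possible uncertainty measure, can be sketched as:

```python
# Probability of excursion above a threshold T:
#   pi(x) = P(Y(x) > T) = Phi((m(x) - T) / s(x))
# and the pointwise uncertainty pi(x) * (1 - pi(x)), which vanishes where the
# classification (above/below T) is already certain.
from math import erf, sqrt

def norm_cdf(z):
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

def probability_of_excursion(m, s, T):
    if s <= 0.0:  # deterministic posterior: classification is known
        return float(m > T)
    return norm_cdf((m - T) / s)

def excursion_uncertainty(m, s, T):
    p = probability_of_excursion(m, s, T)
    return p * (1.0 - p)

print(probability_of_excursion(m=1.0, s=1.0, T=1.0))  # → 0.5 at the threshold
```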
"KrigInv: An efficient and user-friendly implementation of batch-sequential inversion strategies based on Kriging" - Chevalier, Picheny, Ginsbourger,
Implements these batch-sequential inversion criteria as an R package.
Instead of a local criterion (like the ones above),
use a global gain which integrates the local criterion over the whole domain:
i.e. we search for the point which, once added to the design of experiments, most reduces the global uncertainty.
SUR criteria are:
Basically, using the pointwise uncertainty π(x)(1 − π(x)) …
… but this excludes sampling where π(x) is exactly 0 or 1.
Or using :
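A deliberately simplified sketch of the SUR idea: for each candidate x, propagate only the kriging variance reduction that observing f(x) would bring (keeping the current mean fixed, whereas the exact criteria of Bect et al. integrate over the unknown new observation), and pick the candidate that most reduces the integrated excursion uncertainty. All functions and settings below are illustrative:

```python
# Simplified SUR-style selection: minimize the approximate residual
# integrated uncertainty  H = sum_u pi(u) (1 - pi(u))  after observing at x,
# using the one-point kriging variance update
#   s_new^2(u) = s^2(u) - cov(u, x)^2 / s^2(x).
from math import erf, exp, sqrt

def norm_cdf(z):
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

def unc(m, s, T):
    """Pointwise excursion uncertainty pi * (1 - pi) for threshold T."""
    p = norm_cdf((m - T) / s) if s > 0.0 else float(m > T)
    return p * (1.0 - p)

def sur_pick(mean, std, corr, T, grid):
    """Candidate minimizing the approximate residual integrated uncertainty."""
    def residual(x):
        total = 0.0
        for u in grid:
            cov = corr(u, x) * std(u) * std(x)   # posterior covariance sketch
            var_new = max(0.0, std(u) ** 2 - cov ** 2 / std(x) ** 2)
            total += unc(mean(u), sqrt(var_new), T)
        return total
    return min(grid, key=residual)

# Toy posterior on [0, 1]: the mean crosses the threshold T = 0 at u = 0.5,
# constant posterior std, squared-exponential posterior correlation.
mean = lambda u: u - 0.5
std = lambda u: 0.2
corr = lambda u, x: exp(-0.5 * ((u - x) / 0.2) ** 2)
grid = [i / 50.0 for i in range(51)]
print(sur_pick(mean, std, corr, 0.0, grid))  # picks near the boundary u = 0.5
```

As expected, the criterion concentrates sampling near the excursion boundary, where the classification above/below T is most uncertain.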
Languages: C, C++, Fortran, Matlab/Octave, Python, R, Scilab, Julia, …
Libraries: state-of-the-art algorithms: EGO, CMA-ES, NSGA2, PSO